Chapter 19 Kolmogorov Complexity

Abstract

Kolmogorov complexity has intellectual roots in the areas of information theory, computability theory, and probability theory. Despite its remarkably simple basis, it has some striking applications in complexity theory. The subject was developed by the Russian mathematician Andrei N. Kolmogorov (1903–1987) as an approach to the notion of random sequences and to provide an algorithmic approach to information theory [1]. A similar theory was described by Ray J. Solomonoff (1926–) in his study of inductive inference [5, 6]. For a comprehensive treatment of this subject, including its history, we refer to the excellent book of Li and Vitányi [4].

We give some intuitions that point to Kolmogorov complexity. Consider the following binary strings: (The commas are just a visual aid for parsing the strings.) If these strings are continued, we might ask whether the next bit can be predicted based on the first 16 bits shown. For instance, a natural guess is that the 17th bit of x_i is i for i = 0, 1. It is less obvious what the next bit in x_2 ought to be. This question leads to predictive theories about strings from examples, or more generally, the problem of inductive inference. We can also ask for the "shortest descriptions" of strings such as these. Thus x_0 might be described as "a string of zeros of length 12". If x_0 is considered an infinite string, we want the shortest description of an extension of the first 12 bits. This might be "a string of all 0's". If K(x) is the length of the shortest description of x (within some suitable formalism), then we can interpret K(x) as the "information content" of x. One can further view those strings x where K(x) is approximately |x| as "random". This makes a connection with the theory of random strings. Kolmogorov complexity is essentially the study of the function K(x). In the rest of this introduction, we provide some background concepts and unifying notations.

Bit Strings and Natural Numbers.
Unless otherwise noted, "strings" shall mean bit strings and "numbers" shall mean natural numbers. We interchangeably view a bit string x ∈ {0, 1}* as a number. Let ε denote the empty string, and |x| denote the length of x. Note that xy can mean concatenation (if x, y …
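Although K(x) itself is uncomputable, any compressor yields a computable upper bound on it (up to an additive constant): the compressed length is the length of one particular description of x. The following sketch, using Python's standard `zlib` module, illustrates the intuition above: a highly regular string like "all zeros" has a very short description, while a random string does not compress below its own length. The specific strings are illustrative choices, not the (omitted) examples from the text.

```python
import os
import zlib

def compressed_length(s: bytes) -> int:
    """Length of the zlib-compressed form of s: a computable upper
    bound (up to an additive constant) on the shortest description
    length K(s)."""
    return len(zlib.compress(s, 9))

# A highly regular string has a short description ("1024 zeros"),
# and indeed compresses to a tiny fraction of its length.
repetitive = b"0" * 1024

# A random string is incompressible with high probability: its
# compressed form is about as long as the string itself.
random_ish = os.urandom(1024)
```

Comparing `compressed_length(repetitive)` with `compressed_length(random_ish)` shows the gap: the regular string compresses to a few dozen bytes, while the random one stays near (or slightly above) 1024 bytes, mirroring the criterion that x is "random" when K(x) is approximately |x|.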

Similar Articles

Chapter 3 Normalized Information Distance

The normalized information distance is a universal distance measure for objects of all kinds. It is based on Kolmogorov complexity and thus uncomputable, but there are ways to utilize it. First, compression algorithms can be used to approximate the Kolmogorov complexity if the objects have a string representation. Second, for names and abstract concepts, page count statistics from the World Wid...
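The compression-based approximation mentioned in this snippet is the normalized compression distance (NCD) of Cilibrasi and Vitányi, which replaces the uncomputable K(·) in the normalized information distance with the output length C(·) of a real compressor. A minimal sketch, using `zlib` as the compressor (the choice of compressor is an assumption; any standard compressor works similarly):

```python
import zlib

def C(x: bytes) -> int:
    # Compressed length: a computable stand-in for K(x).
    return len(zlib.compress(x, 9))

def ncd(x: bytes, y: bytes) -> float:
    # Normalized compression distance:
    # NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y)).
    # Near 0 for very similar objects, near 1 for unrelated ones.
    cx, cy, cxy = C(x), C(y), C(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)
```

For a string paired with itself, the concatenation compresses almost as well as one copy, so the distance is close to 0; for two independent random strings, the concatenation compresses no better than the two parts separately, so the distance is close to 1 (compressor overhead can push it slightly past either end).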


Kolmogorov Complexity and Solovay Functions

Solovay [19] proved that there exists a computable upper bound f of the prefix-free Kolmogorov complexity function K such that f(x) = K(x) for infinitely many x. In this paper, we consider the class of computable functions f such that K(x) ≤ f(x)+O(1) for all x and f(x) ≤ K(x) + O(1) for infinitely many x, which we call Solovay functions. We show that Solovay functions present interesting conne...


The Kolmogorov complexity of infinite words

We present a brief survey of results on relations between the Kolmogorov complexity of infinite strings and several measures of information content (dimensions) known from dimension theory, information theory or fractal geometry. Special emphasis is laid on bounds on the complexity of strings in constructively given subsets of the Cantor space. Finally, we compare the Kolmogorov complexity to t...


Probability Theory II

Contents Chapter 1. Martingales, continued 1 1.1. Martingales indexed by partially ordered sets 1 1.2. Notions of convergence for martingales 3 1.3. Uniform integrability 4 1.4. Convergence of martingales with directed index sets 6 1.5. Application: The 0-1 law of Kolmogorov 8 1.6. Continuous-time martingales 9 1.7. Tail bounds for martingales 12 1.8. Application: The Pólya urn 13 1.9. Applicat...




Publication date: 2002